Circular Backpropagation Networks for Classification - IEEE Transactions on Neural Networks
Authors
Abstract
The class of mapping networks is a general family of tools for performing a wide variety of tasks; however, no unifying framework exists to describe their theoretical and practical properties. This paper presents a standardized, uniform representation for this class of networks, and introduces a simple modification of the multilayer perceptron with interesting practical properties, especially well suited to pattern classification tasks. The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface-based and the prototype-based schemes, while retaining the advantage of being trainable by backpropagation. The enhancement in representation power and generalization performance is assessed through results on the worst-case requirement in terms of hidden units, and on the Vapnik–Chervonenkis dimension and Cover capacity. The theoretical properties of the network also suggest that the proposed modification to the multilayer perceptron is in many senses optimal. A number of experimental verifications confirm the theoretical results on the model's increased performance, as compared with the multilayer perceptron and the Gaussian radial basis function network.
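The circular modification described above is known to amount to augmenting each input vector with one extra feature equal to its squared norm, so that a single linear unit on the augmented input can realize a circular decision boundary. A minimal sketch (the center, radius, and hand-picked weights below are illustrative assumptions, not trained values):

```python
import numpy as np

def augment(x):
    """Append the sum-of-squares feature ||x||^2 to input vector x."""
    x = np.asarray(x, dtype=float)
    return np.append(x, np.dot(x, x))

# Hand-chosen weights for illustration: with w = 2*c on the original
# inputs, weight -1 on the extra feature, and bias b = r**2 - ||c||**2,
# the unit's pre-activation equals r**2 - ||x - c||**2, which is
# positive inside the circle of radius r around c and negative outside.
c = np.array([1.0, -0.5])   # hypothetical circle center
r = 2.0                     # hypothetical radius
w = np.append(2.0 * c, -1.0)
b = r**2 - np.dot(c, c)

def circular_unit(x):
    """Linear unit on the augmented input; sign gives inside/outside."""
    return float(np.dot(w, augment(x)) + b)

inside = circular_unit(c)                           # center of the circle
outside = circular_unit(c + np.array([10.0, 0.0]))  # far away point
```

A plain linear unit on the raw inputs can only separate with a hyperplane; the single quadratic feature is what lets backpropagation-trainable units also carve out prototype-like, closed regions.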
Related References
Circular backpropagation networks embed vector quantization
This letter proves the equivalence between vector quantization (VQ) classifiers and circular backpropagation (CBP) networks. The calibrated prototypes for a VQ schema can be plugged in a CBP feedforward structure having the same number of hidden neurons and featuring the same mapping. The letter describes how to exploit such equivalence by using VQ prototypes to perform a meaningful initializat...
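The equivalence claimed above rests on the fact that, on the squared-norm-augmented input, a hidden unit with weights [2p, -1] and bias -||p||^2 has pre-activation exactly -||x - p||^2, so the most active hidden unit is the nearest prototype. A hedged sketch of that initialization (function names and the sample prototypes are illustrative, not from the letter):

```python
import numpy as np

def cbp_weights_from_prototype(p):
    """Encode a VQ prototype p as a CBP hidden unit (sketch).

    On the augmented input [x, ||x||^2], weights [2p, -1] and bias
    -||p||^2 give pre-activation 2p.x - ||x||^2 - ||p||^2 = -||x - p||^2.
    """
    p = np.asarray(p, dtype=float)
    w = np.append(2.0 * p, -1.0)
    b = -np.dot(p, p)
    return w, b

def hidden_activation(x, w, b):
    """Pre-activation of one unit on the augmented input."""
    x = np.asarray(x, dtype=float)
    aug = np.append(x, np.dot(x, x))
    return float(np.dot(w, aug) + b)

# Two hypothetical calibrated prototypes, plugged in as hidden units.
prototypes = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
units = [cbp_weights_from_prototype(p) for p in prototypes]

x = np.array([2.5, 2.9])
acts = [hidden_activation(x, w, b) for w, b in units]
nearest = int(np.argmax(acts))  # index of the nearest prototype
```

Because each unit's activation is a negated squared distance, taking the argmax over hidden units reproduces nearest-prototype VQ classification, which is why the prototypes make a meaningful starting point for subsequent backpropagation training.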
Analog design of a new neural network for optical character recognition
An electronic circuit is presented for a new type of neural network, which achieves a recognition rate of over 100 kHz. The network is used to classify handwritten numerals, presented as Fourier and wavelet descriptors, and has been shown to train far more quickly than the popular backpropagation network while maintaining classification accuracy.
Backpropagation Learning for Multi-layer Feed-forward Neural Networks Using the Conjugate Gradient Method. IEEE Transactions on Neural Networks, 1991. [31] M. F. Møller. A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning. Technical Report PB-339.
Nonlinear backpropagation: doing backpropagation without derivatives of the activation function
The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the necessity for calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of fee...
Optimal convergence of on-line backpropagation
Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's PC (perceptron convergence) theorem (1960) for feedforward networks, stating that ...